
    One Digital Health is FAIR

    The One Digital Health framework aims to transform future health ecosystems and to guide the implementation of a digital-technologies-based systemic approach to caring for human and animal health in a managed surrounding environment. To integrate and use the data generated by the ODH data sources, "FAIRness" stands as a prerequisite for proper data management and stewardship.

    European Federation of Medical Informatics Participation in European Projects – HosmartAI Project as an Experience

    The European Federation for Medical Informatics (EFMI) is the leading organisation in medical informatics in Europe and represents 29 countries, RSMI being an active and valuable member. EFMI is organised as a nonprofit organisation concerned with the theory and practice of information science and technology within health and health science in a European context. Its objectives when founded in 1976 were: to advance international cooperation and dissemination of information in medical informatics on a European basis; to promote high standards in the application of medical informatics; to promote research and development in medical informatics; to encourage high standards in education in medical informatics; and to function as the autonomous European Regional Council of IMIA. Since 2018, EFMI has participated actively in European-funded projects, contributing its members' expertise in dissemination, education, data processing, user experience, and several other domains. HosmartAI (2020-2024) is an H2020 project with 24 partners from 12 EU countries and €10 million in funding. In the HosmartAI project – AI for the Smart Hospital of the Future – EFMI leads, within WP6, task T6.3 (Standardization and Legislation) and task T6.4 (Certification, Staff Training & Education, and Alignment with Existing Practice), and has made a substantial contribution to WP2 with the EFMI MIMO tool. The EFMI team will present the solutions developed during the project and invite the audience to give feedback.

    Network and systems medicine: Position paper of the European Collaboration on Science and Technology action on Open Multiscale Systems Medicine

    Introduction: Network and systems medicine has rapidly evolved over the past decade, thanks to computational and integrative tools, which stem in part from systems biology. However, major challenges and hurdles remain regarding validation and translation into clinical application and decision making for precision medicine. Methods: In this context, the Collaboration on Science and Technology Action on Open Multiscale Systems Medicine (OpenMultiMed) reviewed the available advanced technologies for multidimensional data generation and integration in an open-science approach, as well as key clinical applications of network and systems medicine and the main issues and opportunities for the future. Results: The development of multi-omic approaches, together with new digital tools, provides a unique opportunity to explore complex biological systems and networks at different scales. Moreover, the application of the findable, accessible, interoperable, and reusable (FAIR) principles and the adoption of standards increase data availability and sharing for multiscale integration and interpretation. These innovations have led to the first clinical applications of network and systems medicine, particularly in the field of personalized therapy and drug dosing. Broadening the application of network and systems medicine now implies increasing the engagement of patients and health care providers, as well as educating new generations of medical doctors and biomedical researchers to shift from the current organ- and symptom-based medical concepts toward network- and systems-based ones for more precise diagnoses, interventions, and, ideally, prevention. Conclusion: In this dynamic setting, the health care system will also have to evolve, if not be revolutionized, in terms of organization and management.

    Common Limitations of Image Processing Metrics: A Picture Story

    While the importance of automatic image analysis is continuously increasing, recent meta-research has revealed major flaws with respect to algorithm validation. Performance metrics are particularly key for meaningful, objective, and transparent performance assessment and validation of automatic algorithms, but relatively little attention has been given to the practical pitfalls of using specific metrics for a given image analysis task. These are typically related to (1) the disregard of inherent metric properties, such as the behaviour in the presence of class imbalance or small target structures, (2) the disregard of inherent data set properties, such as the non-independence of the test cases, and (3) the disregard of the actual biomedical domain interest that the metrics should reflect. The purpose of this dynamically updated living document is to illustrate important limitations of performance metrics commonly applied in the field of image analysis. In this context, it focuses on biomedical image analysis problems that can be phrased as image-level classification, semantic segmentation, instance segmentation, or object detection tasks. The current version is based on a Delphi process on metrics conducted by an international consortium of image analysis experts from more than 60 institutions worldwide. Comment: This is a dynamic paper on the limitations of commonly used metrics. The current version discusses metrics for image-level classification, semantic segmentation, object detection, and instance segmentation. For missing use cases, comments, or questions, please contact [email protected] or [email protected]. Substantial contributions to this document will be acknowledged with a co-authorship.
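The class-imbalance pitfall named in point (1) can be sketched numerically. The example below is a minimal illustration (the data and the 95:5 split are hypothetical, not taken from the paper): a degenerate classifier that always predicts the majority class scores 95% accuracy, while balanced accuracy exposes it as chance-level.

```python
import numpy as np

# Hypothetical ground truth: 95 negatives, 5 positives (strong imbalance).
y_true = np.array([0] * 95 + [1] * 5)
# A degenerate "classifier" that always predicts the majority class.
y_pred = np.zeros(100, dtype=int)

# Plain accuracy looks excellent despite the model learning nothing.
accuracy = np.mean(y_true == y_pred)            # 0.95

# Balanced accuracy averages per-class recall and reveals the failure.
recall_pos = np.mean(y_pred[y_true == 1] == 1)  # 0.0: every positive missed
recall_neg = np.mean(y_pred[y_true == 0] == 0)  # 1.0
balanced_accuracy = (recall_pos + recall_neg) / 2  # 0.5: chance level

print(accuracy, balanced_accuracy)
```

The same disconnect appears with any prevalence-dependent metric, which is why the document treats metric choice as task-dependent rather than universal.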

    Understanding metric-related pitfalls in image analysis validation

    Validation metrics are key for the reliable tracking of scientific progress and for bridging the current chasm between artificial intelligence (AI) research and its translation into practice. However, increasing evidence shows that, particularly in image analysis, metrics are often chosen inadequately in relation to the underlying research problem. This can be attributed to a lack of accessibility of metric-related knowledge: while taking into account the individual strengths, weaknesses, and limitations of validation metrics is a critical prerequisite to making educated choices, the relevant knowledge is currently scattered and poorly accessible to individual researchers. Based on a multi-stage Delphi process conducted by a multidisciplinary expert consortium, as well as extensive community feedback, the present work provides the first reliable and comprehensive common point of access to information on pitfalls related to validation metrics in image analysis. Focusing on biomedical image analysis but with the potential of transfer to other fields, the addressed pitfalls generalize across application domains and are categorized according to a newly created, domain-agnostic taxonomy. To facilitate comprehension, illustrations and specific examples accompany each pitfall. As a structured body of information accessible to researchers of all levels of expertise, this work enhances global comprehension of a key topic in image analysis validation. Comment: Shared first authors: Annika Reinke, Minu D. Tizabi; shared senior authors: Paul F. Jäger, Lena Maier-Hein.
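One concrete pitfall of this kind concerns small target structures: the same one-pixel boundary error barely dents the Dice score of a large segmented object but collapses it for a tiny one. The sketch below is illustrative (the mask sizes and the one-pixel shift are assumptions, not an example from the paper).

```python
import numpy as np

def dice(gt: np.ndarray, pred: np.ndarray) -> float:
    """Dice similarity coefficient for two binary masks."""
    inter = np.logical_and(gt, pred).sum()
    return 2 * inter / (gt.sum() + pred.sum())

# Large structure: a 40x40 square in a 100x100 image.
gt_large = np.zeros((100, 100), dtype=bool)
gt_large[10:50, 10:50] = True

# Small structure: a 2x2 square in the same image size.
gt_small = np.zeros((100, 100), dtype=bool)
gt_small[10:12, 10:12] = True

# Identical error in both cases: prediction shifted by one pixel per axis.
pred_large = np.roll(gt_large, shift=(1, 1), axis=(0, 1))
pred_small = np.roll(gt_small, shift=(1, 1), axis=(0, 1))

print(dice(gt_large, pred_large))  # ~0.95: barely penalized
print(dice(gt_small, pred_small))  # 0.25: heavily penalized
```

This is why segmentation work often reports a boundary-aware metric alongside Dice when small structures matter clinically.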

    One Digital Health for more FAIRness

    Background  One Digital Health (ODH) aims to propose a framework that merges One Health's and Digital Health's specific features into an innovative landscape. The FAIR (Findable, Accessible, Interoperable, and Reusable) principles treat applications and computational agents (in other terms, data, metadata, and infrastructures) as stakeholders with the capacity to find, access, interoperate with, and reuse data with no or minimal human intervention. Objectives  This paper aims to elicit how the ODH framework complies with the FAIR principles and metrics, providing a thinking guide for investigating and defining whether adapted metrics need to be devised for an effective ODH Intervention setup. Methods  An integrative analysis of the literature was conducted to extract, for each of the three layers of the ODH framework (keys, perspectives, and dimensions), instances of the need for the FAIR principles, or of any deployment of them already in existence. The aim was to assess the extent to which the many facets of FAIRness are pursued in a scattered fashion, owing to the lack of a unifying and balanced framework. Results  A first attempt was made to interpret the different technological components existing in the different layers of the ODH framework in the light of the FAIR principles. Although the mature, working examples of workflows for data FAIRification processes currently retrievable in the literature provide a robust ground to work on, it emerged that FAIR aspects cannot yet be fully assessed for highly interconnected scenarios such as ODH-based ones. There is nevertheless room for improvement, so as to deal in a timely manner with topics such as the delivery of health care in a syndemic scenario, the digital transformation of human and animal health data, or nature conservation through digital technology-based interventions.
Conclusions  ODH pillars account for the availability (findability, accessibility) of human, animal, and environmental data, allowing a unified understanding of complex interactions (interoperability) over time (reusability). A vision of integration between these two worlds, in the form of ODH Interventions featuring FAIRness characteristics and directed toward a systemic view of health and ecology in a digitalized way, is therefore desirable.
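To make the four FAIR facets concrete at the metadata level, the sketch below pairs each facet with a machine-readable field. This is a rough illustration only: the identifier, URLs, and field names are hypothetical placeholders loosely inspired by schema.org/Dataset, and the ODH papers above do not prescribe any such layout.

```python
# Hypothetical metadata record; every identifier and URL is a placeholder.
dataset_metadata = {
    # Findable: a globally unique, persistent identifier plus rich metadata.
    "@id": "https://doi.org/10.9999/example-odh-dataset",  # placeholder DOI
    "name": "Example One Digital Health surveillance dataset",
    "keywords": ["One Digital Health", "FAIR", "zoonosis"],
    # Accessible: retrievable via a standard, open protocol (HTTPS here).
    "contentUrl": "https://repository.example.org/odh/records/123",
    # Interoperable: payload encoded with a shared, formal vocabulary.
    "encodingFormat": "application/fhir+json",  # e.g. HL7 FHIR resources
    # Reusable: explicit licence and provenance statements.
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "provenance": "Aggregated from human, animal, and environmental sensors",
}

# With metadata in this shape, a basic FAIRness check can be automated:
required = ["@id", "contentUrl", "encodingFormat", "license"]
missing = [field for field in required if field not in dataset_metadata]
print("FAIR-ready" if not missing else f"missing fields: {missing}")
```

Automating such checks at scale, across the interconnected layers of an ODH Intervention, is precisely where the abstract reports that current FAIR assessment tooling falls short.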

    Intergenerational knowledge management in a cutting-edge Israeli industry: Visions and challenges.

    Knowledge management is a multifaceted, complex, end-to-end organizational process dealing with collecting and using the data, information, and knowledge generated by a group of individuals. The current study examines the changes required in companies' quality systems to enhance intergenerational learning and knowledge retention. Our primary objective was to understand the factors that influence the development of an organizational culture that encourages innovation, knowledge sharing, organizational learning, and openness, and that provides opportunities to create up-to-date knowledge. We collected the viewpoints and needs of industry professionals through interviews and a survey, and then analyzed the factors that influence the quality of knowledge management and its transfer between workforce generations. The professionals' primary goal is to introduce, integrate, and improve knowledge in their organizations; their second goal is to facilitate knowledge sharing and transfer between workforce generations. Improving transgenerational knowledge sharing and reducing the loss of knowledge are challenges for all industries. A cutting-edge industry such as the defense sector deals with sensitive data, and knowledge management is a strategic need in a competitive context. Quality management standards propose guidelines for developing and enhancing the overall knowledge-related processes; implementing them, however, requires a shift in corporate culture. Organizational knowledge resilience must be developed by involving the workforce in implementing knowledge management systems.